F-divergence Is a Generalized Invariant Measure between Distributions
Authors
Abstract
Finding measures (or features) that are invariant to the inevitable variations caused by non-linguistic factors (transformations) is a fundamental and important problem in speech recognition. Recently, Minematsu [1, 2] proved that the Bhattacharyya distance (BD) between two distributions is invariant to invertible transforms on the feature space, and developed an invariant structural representation of speech based on it. A natural question arises: which kinds of measures can be invariant? In this paper, we prove that f-divergence yields a generalized family of invariant measures, and show that any invariant measure must take the form of an f-divergence. Many well-known distances and divergences in information theory and statistics, such as the Bhattacharyya distance, the KL-divergence, and the Hellinger distance, can be written as f-divergences. As an application, we carried out experiments on recognizing utterances of connected Japanese vowels. The experimental results show that BD and KL-divergence give the best performance among the measures compared.
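To make the invariance claim concrete, here is a minimal numerical sketch (my own illustration, not part of the paper or its experiments): it checks that the KL-divergence and the Bhattacharyya distance between two Gaussian densities, both computed as f-divergence-style integrals of the form ∫ q(x) f(p(x)/q(x)) dx, are unchanged when the feature space is warped by the invertible map y = tanh(x/4). The particular Gaussians, the map, and the generator functions f are arbitrary choices for illustration.

import numpy as np
from scipy.integrate import quad
from scipy.special import xlogy
from scipy.stats import norm

p = norm(loc=0.0, scale=1.0)   # density p(x)
q = norm(loc=1.0, scale=2.0)   # density q(x)

# Invertible map y = g(x) = tanh(x / 4); g^{-1}(y) = 4 * arctanh(y),
# |d g^{-1}(y) / dy| = 4 / (1 - y^2).
def transformed_pdf(dist, y):
    """Density of Y = tanh(X / 4) when X ~ dist (change-of-variables formula)."""
    return dist.pdf(4.0 * np.arctanh(y)) * 4.0 / (1.0 - y ** 2)

def f_integral(p_pdf, q_pdf, f, lo, hi):
    """Numerically evaluate the integral of q(x) * f(p(x) / q(x)) over [lo, hi]."""
    return quad(lambda x: q_pdf(x) * f(p_pdf(x) / q_pdf(x)), lo, hi, limit=200)[0]

# f(t) = t * log(t) gives the KL-divergence; f(t) = sqrt(t) gives the
# Bhattacharyya coefficient, from which BD = -log(coefficient).
cases = [("KL-divergence", lambda t: xlogy(t, t), False),
         ("Bhattacharyya distance", lambda t: np.sqrt(t), True)]

eps = 1e-12
for name, f, take_neg_log in cases:
    before = f_integral(p.pdf, q.pdf, f, -30.0, 30.0)
    after = f_integral(lambda y: transformed_pdf(p, y),
                       lambda y: transformed_pdf(q, y), f, -1.0 + eps, 1.0 - eps)
    if take_neg_log:
        before, after = -np.log(before), -np.log(after)
    # The two numbers agree up to quadrature error, illustrating the invariance.
    print(f"{name:24s} original: {before:.6f}   after transform: {after:.6f}")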
Similar references
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as the Kullback-Leibler information, the J-divergence, the Hellinger distance, the …-divergence, and so on. Properties and results related to distance between probability d...
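As a small side illustration of the copula connection (my own sketch, with assumptions not stated in the abstract): for a bivariate Gaussian, the mutual information, i.e. the Kullback-Leibler divergence between the joint density and the product of its marginals, equals the expected log copula density, and it has the closed form -0.5 * log(1 - rho^2).

import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])
rng = np.random.default_rng(0)
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=200_000)

# log of the Gaussian copula density at the samples:
# log c(u, v) = log f(z1, z2) - log phi(z1) - log phi(z2).
joint = multivariate_normal(mean=[0.0, 0.0], cov=cov)
log_c = joint.logpdf(z) - norm.logpdf(z[:, 0]) - norm.logpdf(z[:, 1])

print("Monte-Carlo mutual information:", log_c.mean())
print("Closed-form mutual information:", -0.5 * np.log(1.0 - rho ** 2))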
Generalized Baer-Invariant of a Pair of Groups and Marginal Extension
In this paper, we establish a connection between the order of the generalized Baer-invariant of a pair of finite groups and that of its factor groups, when the variety under consideration is a specific variety. Moreover, we give a necessary and sufficient condition under which the generalized Baer-invariant of a pair of groups can be embedded into the generalized Baer-invariant of the pair of its factor groups.
Computing the Kullback-Leibler Divergence between two Generalized Gamma Distributions
We derive a closed form solution for the Kullback-Leibler divergence between two generalized gamma distributions. These notes are meant as a reference and provide a guided tour towards a result of practical interest that is rarely explicated in the literature. The origins of the generalized gamma distribution can be traced back to the work of Amoroso in 1925 [1,...
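The closed-form expression itself is not reproduced here; the following sketch (an illustrative cross-check of mine, with arbitrary parameter values) computes the same divergence by numerical quadrature with scipy.stats.gengamma, which is a convenient sanity check for any closed-form implementation.

import numpy as np
from scipy.integrate import quad
from scipy.stats import gengamma

p = gengamma(a=2.0, c=1.5, scale=1.0)   # shape a, power c, in scipy's parameterization
q = gengamma(a=3.0, c=0.8, scale=2.0)

def kl_divergence(p_dist, q_dist):
    """KL(p || q) = integral over (0, inf) of p(x) * (log p(x) - log q(x))."""
    integrand = lambda x: p_dist.pdf(x) * (p_dist.logpdf(x) - q_dist.logpdf(x))
    return quad(integrand, 0.0, np.inf, limit=200)[0]

print("KL(p || q) =", kl_divergence(p, q))
print("KL(q || p) =", kl_divergence(q, p))   # the KL-divergence is not symmetric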
Truncated Linear Minimax Estimator of a Power of the Scale Parameter in a Lower-Bounded Parameter Space
Minimax estimation problems with a restricted parameter space have received increasing interest within the last two decades. Some authors have derived minimax and admissible estimators of bounded parameters under squared error loss and scale-invariant squared error loss. In some truncated estimation problems, the most natural estimator to consider is the truncated version of a classic...
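A toy simulation (my own illustrative example, not the paper's estimator or loss) shows why truncation is natural when a normal mean is known to satisfy mu >= a: clipping the sample mean at the lower bound never increases the squared-error risk and helps most when the true mean sits close to the bound.

import numpy as np

rng = np.random.default_rng(1)
a = 0.0                   # known lower bound of the parameter space
n, reps = 10, 100_000     # sample size and Monte-Carlo replications

for mu in (0.0, 0.2, 1.0, 3.0):   # true means, all >= a
    x_bar = rng.normal(loc=mu, scale=1.0, size=(reps, n)).mean(axis=1)
    plain_mse = np.mean((x_bar - mu) ** 2)
    truncated_mse = np.mean((np.maximum(x_bar, a) - mu) ** 2)
    print(f"mu = {mu:3.1f}:  MSE(sample mean) = {plain_mse:.4f}, "
          f"MSE(truncated sample mean) = {truncated_mse:.4f}")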
Scale-Invariant Divergences for Density Functions
Divergence is a discrepancy measure between two objects, such as functions, vectors, matrices, and so forth. In particular, divergences defined on probability distributions are widely employed in probabilistic forecasting. As a dissimilarity measure, a divergence should satisfy some conditions. In this paper, we consider two conditions: the first one is the scale-invariance property and the...
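As a concrete, purely illustrative instance of the scale-invariance condition (this example is mine, not taken from the paper): the gamma-divergence of Fujisawa and Eguchi is unchanged when the two density functions are multiplied by arbitrary positive constants, whereas the plain KL integral is not.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

p = norm(0.0, 1.0).pdf
q = norm(1.0, 2.0).pdf

def gamma_divergence(p_fun, q_fun, gamma=0.5, lo=-30.0, hi=30.0):
    """Gamma-divergence built from three log-integrals of powers of p and q."""
    integral = lambda fun: quad(fun, lo, hi, limit=200)[0]
    pp = integral(lambda x: p_fun(x) ** (1.0 + gamma))
    pq = integral(lambda x: p_fun(x) * q_fun(x) ** gamma)
    qq = integral(lambda x: q_fun(x) ** (1.0 + gamma))
    return (np.log(pp) / (gamma * (1.0 + gamma))
            - np.log(pq) / gamma
            + np.log(qq) / (1.0 + gamma))

def kl_integral(p_fun, q_fun, lo=-30.0, hi=30.0):
    """Integral of p * log(p / q); this is the KL-divergence only for normalized densities."""
    return quad(lambda x: p_fun(x) * np.log(p_fun(x) / q_fun(x)), lo, hi, limit=200)[0]

c1, c2 = 3.0, 0.25   # arbitrary positive rescalings
p_scaled = lambda x: c1 * p(x)
q_scaled = lambda x: c2 * q(x)

print("gamma-divergence, original densities:", gamma_divergence(p, q))
print("gamma-divergence, rescaled functions:", gamma_divergence(p_scaled, q_scaled))
print("KL-style integral, original densities:", kl_integral(p, q))
print("KL-style integral, rescaled functions:", kl_integral(p_scaled, q_scaled))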